# Miniature Distilled BERT

Rubert Tiny (MIT license)

An extremely compact distilled version (45 MB, 12M parameters) of the bert-base-multilingual-cased model for Russian and English, prioritizing speed and small size over absolute accuracy.

Tags: Large Language Model · Transformers · Multilingual
Author: cointegrated
Downloads: 36.18k
Likes: 41
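
Because the model follows the standard BERT architecture, it can be loaded directly with the Hugging Face transformers library. Below is a minimal sketch, assuming the checkpoint is published on the Hub as `cointegrated/rubert-tiny`; the mean-pooling helper `embed` is an illustrative choice for turning token embeddings into sentence vectors, not part of the model's own API.

```python
# Minimal sketch: load rubert-tiny and compute mean-pooled sentence embeddings.
# Assumes the Hub ID "cointegrated/rubert-tiny"; the embed() helper is ours.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cointegrated/rubert-tiny")
model = AutoModel.from_pretrained("cointegrated/rubert-tiny")

def embed(texts):
    # Tokenize a batch of Russian/English sentences with padding and truncation.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        output = model(**batch)
    # Mean-pool the token embeddings, masking out padding positions.
    mask = batch["attention_mask"].unsqueeze(-1).float()
    summed = (output.last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

vectors = embed(["Привет, мир!", "Hello, world!"])
print(vectors.shape)  # (2, hidden_size)
```

Mean pooling is one common way to get sentence vectors from a plain BERT encoder; the official model card may recommend a different pooling scheme for best results.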